Entropy Increase as a Consequence of Measure Invariance

Authors

Abstract


Similar articles

Neutrino Polarization as a Consequence of Gauge Invariance

It is pointed out that there are gauge-dependent and gauge-independent spinors within the little-group framework for internal space-time symmetries of massless particles. It is shown that two of the SL(2, c) spinors are invariant under gauge transformations while the remaining two are not. The Dirac equation contains only the gauge-invariant spinors leading to polarized neutrinos. It is shown t...


Poverty as a cause and consequence of Ill health

Background and aims: Poverty is a multidimensional phenomenon that can be defined in both economic and social terms. The paper attempts to review the existing evidence to understand the relation between poverty and ill health in the context of the limited conceptual and operational definitions of these terms. The paper uses two of Hill's criteria (reversibility and dose-response relationship) to ...


Hidden Consequence of Active Local Lorentz Invariance

In this paper we investigate a hidden consequence of the hypothesis that Lagrangians and field equations must be invariant under active local Lorentz transformations. We show that this hypothesis implies an equivalence between spacetime structures with several curvature and torsion possibilities.


Entropy as a measure of quality of XML schema document

In this paper, a metric for the assessment of the structural complexity of eXtensible Markup Language schema documents is formulated. The present metric, 'Schema Entropy' (SE), is based on the entropy concept and is intended to measure the complexity of schema documents written in the W3C XML Schema Language due to diversity in the structures of their elements. The SE is useful in evaluating the efficiency of th...


Entropy as a Measure of Average Loss of Privacy

Privacy means that not everything about a person is known, and that we need to ask additional questions to get the full information about the person. It therefore seems reasonable to gauge the degree of privacy in each situation by the average number of binary ("yes"-"no") questions that we need to ask to determine the full information, which is exactly Shannon's entropy. The problem with this ...
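
As a quick illustration of that identification (my own sketch, not taken from the abstract above), Shannon's entropy for outcome probabilities p_1, ..., p_n can be read as the average number of optimally chosen yes/no questions needed to pin down the outcome:

% Shannon entropy in bits; the base-2 logarithm reflects binary questions
H(p) = -\sum_{i=1}^{n} p_i \log_2 p_i

For example, with probabilities (1/2, 1/4, 1/4) an optimal strategy asks one question half the time and two questions otherwise, giving an average of (1/2)(1) + (1/4)(2) + (1/4)(2) = 1.5 questions, which matches H = 1.5 bits.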



Journal

Journal title: Proceedings of the American Mathematical Society

Year: 1985

ISSN: 0002-9939

DOI: 10.2307/2045611